
    Cultural integration and its discontents

    social integration, culture

    Computer-aided verification in mechanism design

    In mechanism design, the gold standard solution concepts are dominant strategy incentive compatibility and Bayesian incentive compatibility. These solution concepts relieve the (possibly unsophisticated) bidders of the need to engage in complicated strategizing. While incentive properties are simple to state, their proofs are specific to the mechanism and can be quite complex. This raises two concerns. From a practical perspective, checking a complex proof can be a tedious process, often requiring experts knowledgeable in mechanism design. Furthermore, from a modeling perspective, if unsophisticated agents are unconvinced of incentive properties, they may strategize in unpredictable ways. To address both concerns, we explore techniques from computer-aided verification to construct formal proofs of incentive properties. Because formal proofs can be automatically checked, agents do not need to manually check the properties, or even understand the proof. To demonstrate, we present the verification of a sophisticated mechanism: the generic reduction from Bayesian incentive compatible mechanism design to algorithm design given by Hartline, Kleinberg, and Malekian. This mechanism presents new challenges for formal verification, including essential use of randomness both from the execution of the mechanism and from the prior type distributions. As an immediate consequence, our work also formalizes Bayesian incentive compatibility for the entire family of mechanisms derived via this reduction. Finally, as an intermediate step in our formalization, we provide the first formal verification of incentive compatibility for the celebrated Vickrey-Clarke-Groves mechanism.
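
    The property being verified here can be illustrated on a much smaller scale. The sketch below is not the paper's formal, machine-checked proof; it is a brute-force sanity check, under the assumption of a small discrete value grid and quasi-linear utilities, that truthful bidding is a dominant strategy in a sealed-bid second-price (Vickrey) auction.

```python
# A minimal sketch (not the paper's formal verification): brute-force check that
# truthful bidding is a dominant strategy in a second-price auction over a small
# discrete value grid. The grid and tie-breaking rule are illustrative assumptions.
from itertools import product

VALUES = [0, 1, 2, 3]  # hypothetical discrete type/bid space

def second_price_utility(values, bids, i):
    """Quasi-linear utility of bidder i: value minus payment if i wins, else 0."""
    winner = max(range(len(bids)), key=lambda j: (bids[j], -j))  # ties -> lowest index
    if winner != i:
        return 0.0
    others = [bids[j] for j in range(len(bids)) if j != i]
    return values[i] - max(others)

def is_dsic(n=3):
    """True if no bidder ever gains by deviating from truthful bidding."""
    for values in product(VALUES, repeat=n):
        for i in range(n):
            for other_bids in product(VALUES, repeat=n - 1):
                bids = list(other_bids[:i]) + [values[i]] + list(other_bids[i:])
                truthful = second_price_utility(values, bids, i)
                for deviation in VALUES:
                    bids[i] = deviation
                    if second_price_utility(values, bids, i) > truthful + 1e-12:
                        return False
                    bids[i] = values[i]  # restore the truthful bid
    return True

if __name__ == "__main__":
    print("second-price auction DSIC on the grid:", is_dsic())
```

    Exhaustive checking like this only works for tiny discrete settings; the point of the paper is precisely that formal verification scales the guarantee to mechanisms and type spaces where enumeration is impossible.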

    Development of airborne eddy-correlation flux measurement capabilities for reactive oxides of nitrogen

    This research is aimed at producing a fundamentally new research tool for characterizing the source strength of the most important compound controlling the hemispheric- and global-scale distribution of tropospheric ozone. Specifically, this effort seeks to demonstrate the proof of concept of a new general-purpose laser-induced-fluorescence spectrometer for making airborne eddy-correlation flux measurements of nitric oxide (NO) and other reactive nitrogen compounds. The new all-solid-state laser technology being used in this advanced sensor will produce a forerunner of the type of sensor technology that should eventually result in highly compact operational systems. The proof-of-concept sensor being developed will have over two orders of magnitude greater sensitivity than present-day instruments. In addition, this sensor will offer the possibility of eventual extension to airborne eddy-correlation flux measurements of nitrogen dioxide (NO2) and possibly other compounds, such as ammonia (NH3), peroxy radicals (HO2), nitrate radicals (NO3), and several iodine compounds (e.g., I and IO). The new sensor's ability to measure NO fluxes will be demonstrated through a series of laboratory and field tests. This proof-of-concept demonstration will show not only that airborne flux measurements of important ultra-trace compounds can be made at the few parts-per-trillion level, but also that they can achieve the high accuracy and precision currently needed for predictive models. These measurement capabilities will greatly enhance our current ability to quantify the fluxes of reactive nitrogen into the troposphere and significantly improve the accuracy of predictive models of O3's distribution within the remote troposphere. This development effort also offers a timely approach for producing the reactive nitrogen flux measurement capabilities that will be needed by future research programs such as NASA's planned 1999 Amazon Biogeochemistry and Atmospheric Chemistry Experimental portion of LBA.
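
    The eddy-correlation (eddy-covariance) flux referred to above is the covariance of the fluctuating vertical wind with the fluctuating NO mixing ratio over an averaging interval. A minimal sketch, assuming co-located, time-aligned 10 Hz series; the synthetic data and simple mean removal below are illustrative, not the instrument's actual processing.

```python
# Minimal eddy-covariance sketch: flux = mean(w' * c'), where primes denote
# deviations from the interval mean. Synthetic 10 Hz series stand in for
# real airborne vertical-wind and NO data.
import numpy as np

rng = np.random.default_rng(0)
n = 10 * 60 * 30                          # 30 minutes at 10 Hz (illustrative)
w = rng.normal(0.0, 0.5, n)               # vertical wind, m/s (synthetic)
no = 50 + 0.2 * w + rng.normal(0, 2, n)   # NO mixing ratio, pptv, weakly coupled to w

def eddy_flux(w, c):
    """Covariance of vertical wind and a scalar: a simple mean-removal flux estimate."""
    w_prime = w - w.mean()
    c_prime = c - c.mean()
    return np.mean(w_prime * c_prime)     # units: (m/s) * pptv

print(f"NO flux estimate: {eddy_flux(w, no):.3f} (m/s)*pptv")
```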

    Chemical NOx budget in the upper troposphere over the tropical South Pacific

    The chemical NOx budget in the upper troposphere over the tropical South Pacific is analyzed using aircraft measurements made at 6-12 km altitude in September 1996 during the Global Tropospheric Experiment (GTE) Pacific Exploratory Mission (PEM) Tropics A campaign. Chemical loss and production rates of NOx along the aircraft flight tracks are calculated with a photochemical model constrained by observations. Calculations using a standard chemical mechanism show a large missing source for NOx; chemical loss exceeds chemical production by a factor of 2.4 on average. Similar or greater NOx budget imbalances have been reported in analyses of data from previous field studies. Ammonium aerosol concentrations in PEM-Tropics A generally exceeded sulfate on a charge-equivalent basis, and relative humidities were low (median 25% relative to ice). This implies that the aerosol could be dry, in which case N2O5 hydrolysis would be suppressed as a sink for NOx. Suppression of N2O5 hydrolysis and adoption of new measurements of the reaction rate constants for NO2 + OH + M and HNO3 + OH reduce the median chemical imbalance in the NOx budget for PEM-Tropics A from 2.4 to 1.9. The remaining imbalance cannot be easily explained by known chemistry or long-range transport of primary NOx and may imply a major gap in our understanding of the chemical cycling of NOx in the free troposphere. Copyright 2000 by the American Geophysical Union.
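
    The imbalance quoted above is the ratio of the instantaneous NOx chemical loss rate to its production rate. A minimal box-style sketch of that bookkeeping, assuming loss is dominated by NO2 + OH + M and production by HNO3 + OH and HNO3 photolysis; every number below is a placeholder chosen only to show the arithmetic, not a PEM-Tropics A measurement or a recommended rate coefficient.

```python
# Illustrative NOx loss/production ratio. All concentrations and rate
# coefficients are placeholders, not measured or recommended values.
NO2  = 3.0e8    # molecules cm^-3 (placeholder)
HNO3 = 2.0e9    # molecules cm^-3 (placeholder)
OH   = 1.0e6    # molecules cm^-3 (placeholder)

k_no2_oh  = 1.0e-11   # cm^3 s^-1, NO2 + OH + M -> HNO3       (placeholder)
k_hno3_oh = 1.5e-13   # cm^3 s^-1, HNO3 + OH -> NO3 + H2O     (placeholder)
j_hno3    = 7.0e-7    # s^-1, HNO3 + hv -> NO2 + OH           (placeholder)

loss       = k_no2_oh * NO2 * OH                      # NOx converted to HNO3
production = k_hno3_oh * HNO3 * OH + j_hno3 * HNO3    # HNO3 recycled back to NOx
print(f"loss/production imbalance: {loss / production:.2f}")
```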

    Sequential Posted Price Mechanisms with Correlated Valuations

    We study the revenue performance of sequential posted price mechanisms and some natural extensions in a general setting where the valuations of the buyers are drawn from a correlated distribution. Sequential posted price mechanisms are conceptually simple mechanisms that work by proposing a take-it-or-leave-it offer to each buyer. We apply sequential posted price mechanisms to single-parameter multi-unit settings in which each buyer demands only one item and the mechanism can assign the service to at most k of the buyers. For standard sequential posted price mechanisms, we prove that when the valuation distribution has finite support, no sequential posted price mechanism can extract a constant fraction of the optimal expected revenue, even with unlimited supply. We extend this result to the case of a continuous valuation distribution when various standard assumptions hold simultaneously. In fact, it turns out that the best fraction of the optimal revenue that is extractable by a sequential posted price mechanism is proportional to the ratio of the highest and lowest possible valuations. We prove that two simple generalizations of these mechanisms achieve better revenue performance: if the sequential posted price mechanism has, for each buyer, the option of either proposing an offer or asking the buyer for its valuation, then an Omega(1/max{1,d}) fraction of the optimal revenue can be extracted, where d denotes the degree of dependence of the valuations, ranging from complete independence (d=0) to arbitrary dependence (d=n-1). Moreover, when we generalize sequential posted price mechanisms further, so that the mechanism can make a take-it-or-leave-it offer to the i-th buyer that depends on the valuations of all buyers except i, we prove that a constant fraction (2-sqrt{e})/4 ≈ 0.088 of the optimal revenue can always be extracted.
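
    A minimal simulation sketch of a standard sequential posted price mechanism restricted to a single anonymous price (a special case of the mechanisms studied), run on a toy correlated prior and compared against the full-surplus benchmark, which upper-bounds the optimal revenue. The distribution, price grid, supply k, and sample sizes are illustrative assumptions, not the paper's constructions.

```python
# Sketch: fixed-price sequential posted-price mechanism on correlated valuations.
# Buyers are approached in order; each accepts iff value >= price, until k units sell.
import numpy as np

rng = np.random.default_rng(1)

def sample_correlated_values(n, n_samples):
    """Toy correlated prior: a shared 'market level' times idiosyncratic noise."""
    common = rng.lognormal(mean=0.0, sigma=1.0, size=(n_samples, 1))
    noise = rng.lognormal(mean=0.0, sigma=0.3, size=(n_samples, n))
    return common * noise

def spp_revenue(values, price, k):
    """Revenue of the posted-price mechanism with supply k on one valuation profile."""
    sold, revenue = 0, 0.0
    for v in values:                      # fixed arrival order
        if sold == k:
            break
        if v >= price:
            revenue += price
            sold += 1
    return revenue

n, k, n_samples = 5, 2, 20000
vals = sample_correlated_values(n, n_samples)
benchmark = np.mean([np.sort(v)[::-1][:k].sum() for v in vals])  # full surplus of top-k values

best = max(
    (np.mean([spp_revenue(v, p, k) for v in vals]), p)
    for p in np.quantile(vals, np.linspace(0.05, 0.95, 19))      # coarse price grid
)
print(f"best fixed price {best[1]:.2f}: revenue fraction {best[0] / benchmark:.3f}")
```

    On strongly correlated toy priors like this one, the achievable fraction of the benchmark is sensitive to the spread between the highest and lowest realized valuations, which is the qualitative behavior the negative result above describes.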

    Large-scale distributions of tropospheric nitric, formic, and acetic acids over the western Pacific basin during wintertime

    We report here measurements of the acidic gases nitric acid (HNO3), formic acid (HCOOH), and acetic acid (CH3COOH) over the western Pacific basin during the February-March 1994 Pacific Exploratory Mission-West (PEM-West B). These data were obtained aboard the NASA DC-8 research aircraft as it flew missions in the altitude range of 0.3–12.5 km over equatorial regions near Guam and then farther westward, encompassing the entire Pacific Rim arc. Aged marine air over the equatorial Pacific generally exhibited mixing ratios of acidic gases below 100 parts per trillion by volume (pptv). Near the Asian continent, discrete plumes encountered below 6 km altitude contained up to 8 parts per billion by volume (ppbv) HNO3 and 10 ppbv HCOOH and CH3COOH. Overall there was a general correlation between the mixing ratios of acidic gases and those of CO, C2H2, and C2Cl4, indicative of emissions from combustion and industrial sources. The latitudinal distributions of HNO3 and CO showed that the largest mixing ratios were centered around 15°N, while HCOOH, CH3COOH, and C2Cl4 peaked at 25°N. The mixing ratios of HCOOH and CH3COOH were highly correlated (r2 = 0.87) below 6 km altitude, with a slope (0.89) characteristic of the nongrowing season at midlatitudes in the northern hemisphere. Above 6 km altitude, HCOOH and CH3COOH were only marginally correlated (r2 = 0.50), and plumes well defined by CO, C2H2, and C2Cl4 were depleted in acidic gases, most likely due to scavenging during vertical transport of air masses through convective cloud systems over the Asian continent. In stratospheric air masses, HNO3 mixing ratios were several ppbv, yielding relationships with O3 and N2O consistent with those previously reported for NOy.
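
    The reported slope and r² are an ordinary least-squares fit between the two mixing-ratio series. A minimal sketch on synthetic data; the arrays below are illustrative stand-ins, not the PEM-West B measurements.

```python
# Sketch: slope and r^2 between two co-measured species, as used for the
# HCOOH vs. CH3COOH comparison. Synthetic pptv series stand in for real data.
import numpy as np

rng = np.random.default_rng(2)
ch3cooh = rng.lognormal(mean=5.0, sigma=0.8, size=500)       # pptv (synthetic)
hcooh = 0.89 * ch3cooh + rng.normal(0.0, 30.0, size=500)     # pptv (synthetic)

slope, intercept = np.polyfit(ch3cooh, hcooh, 1)             # least-squares line
r = np.corrcoef(ch3cooh, hcooh)[0, 1]
print(f"slope = {slope:.2f}, r^2 = {r**2:.2f}")
```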

    Evaluating regional emission estimates using the TRACE-P observations

    Measurements obtained during the NASA Transport and Chemical Evolution over the Pacific (TRACE-P) experiment are used in conjunction with regional modeling analysis to evaluate emission estimates for Asia. A comparison between modeled values and observations is one method of evaluating emissions. Based on such analysis it is concluded that the inventory performs well for the light alkanes, CO, ethyne, SO2, and NOₓ. Furthermore, based on model skill in predicting important photochemical species such as O₃, HCHO, OH, HO₂, and HNO₃, it is found that the emissions inventories are of sufficient quality to support preliminary studies of ozone production. These are important findings in light of the fact that emission estimates for many species (such as speciated NMHCs and BC) for this region have only recently been produced and are highly uncertain. Using a classification of the measurements built upon trajectory analysis, we compare observed species distributions and ratios of species to those modeled and to ratios estimated from the emissions inventory. It is shown that this technique can reconstruct a spatial distribution of propane/benzene that looks remarkably similar to that calculated from the emissions inventory. A major discrepancy between modeled and observed behavior is found in the Yellow Sea, where the model systematically underpredicts the observations. The integrated analysis suggests that this may be related to an underestimation of emissions from the domestic sector. The emission estimates are further tested by comparing modeled and observed species ratios in identified megacity plumes. Many of the model-derived ratios (e.g., BC/CO, SOₓ/C₂H₂) fall within ∼25% of those observed, and none falls outside of a factor of 2.5.
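
    A minimal sketch of the ratio test described above: compare a modeled to an observed enhancement ratio in an identified plume and flag whether it falls within ±25% and within a factor of 2.5. The example ratios are made up for illustration, not TRACE-P values.

```python
# Sketch: comparing modeled vs. observed species ratios (e.g., BC/CO, SOx/C2H2)
# in an identified plume. The numbers are illustrative placeholders.
def ratio_agreement(modeled, observed):
    """Return (within 25%, within a factor of 2.5) for a modeled/observed ratio pair."""
    rel = modeled / observed
    within_25pct = 0.75 <= rel <= 1.25
    within_factor = (1 / 2.5) <= rel <= 2.5
    return within_25pct, within_factor

plume_ratios = {                 # hypothetical (modeled, observed) enhancement ratios
    "BC/CO": (0.0051, 0.0047),
    "SOx/C2H2": (3.1, 2.6),
}
for name, (mod, obs) in plume_ratios.items():
    ok25, ok_factor = ratio_agreement(mod, obs)
    print(f"{name}: within 25%: {ok25}, within factor of 2.5: {ok_factor}")
```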

    Computing optimal coalition structures in polynomial time

    The optimal coalition structure determination problem is, in general, computationally hard. In this article, we identify some problem instances for which the space of possible coalition structures has a certain form and constructively prove that the problem is polynomial-time solvable. Specifically, we consider games with an ordering over the players and introduce a distance metric for measuring the distance between any two coalition structures. In terms of this metric, we define the property of monotonicity, meaning that coalition structures closer to the optimal structure, as measured by the metric, have higher value than those further away. Similarly, quasi-monotonicity means that part of the space of coalition structures is monotonic, while part of it is non-monotonic. (Quasi-)monotonicity is a property that can be satisfied by coalition games in characteristic function form as well as those in partition function form. For a setting with a monotonic value function and a known player ordering, we prove that the optimal coalition structure determination problem is polynomial-time solvable and devise such an algorithm using a greedy approach. We extend this algorithm to quasi-monotonic value functions and demonstrate how its time complexity improves from exponential to polynomial as the degree of monotonicity of the value function increases. We go further and consider a setting in which the value function is monotonic and an ordering over the players is known to exist, but the ordering itself is unknown. For this setting too, we prove that the coalition structure determination problem is polynomial-time solvable and devise such an algorithm.
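
    A hypothetical greedy sketch in the spirit of the approach described above, not the authors' algorithm: players keep their given order, coalitions are contiguous blocks, and adjacent blocks are merged greedily while the total characteristic-function value improves. The value function below is a made-up monotone-style example used only to exercise the loop.

```python
# Hypothetical greedy search over coalition structures of ordered players.
# Coalitions are contiguous blocks; adjacent blocks are merged while the total
# characteristic-function value improves. Not the algorithm from the article.
from typing import Callable, List, Tuple

Coalition = Tuple[int, ...]

def toy_value(c: Coalition) -> float:
    """Illustrative characteristic function: mild synergy with a coordination cost."""
    return len(c) ** 1.5 - 0.1 * len(c) ** 2

def greedy_structure(n: int, v: Callable[[Coalition], float]) -> List[Coalition]:
    structure: List[Coalition] = [(i,) for i in range(n)]   # start from singletons
    while True:
        best_gain, best_i = 0.0, None
        for i in range(len(structure) - 1):                 # try every adjacent merge
            merged = structure[i] + structure[i + 1]
            gain = v(merged) - v(structure[i]) - v(structure[i + 1])
            if gain > best_gain:
                best_gain, best_i = gain, i
        if best_i is None:                                  # no improving merge left
            return structure
        structure[best_i:best_i + 2] = [structure[best_i] + structure[best_i + 1]]

if __name__ == "__main__":
    cs = greedy_structure(6, toy_value)
    print(cs, sum(toy_value(c) for c in cs))
```

    Each pass inspects at most n-1 adjacent merges and the structure shrinks by one block per merge, so the sketch runs in polynomial time; whether greedy merging finds the optimum depends on properties like the monotonicity condition the article formalizes.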